AUTOMATED PILOT ADVISORY SYSTEM TEST AND EVALUATION

AT 
MANASSAS MUNICIPAL AIRPORT 



John L. Parks, Jr. 
Wallops Flight Center 
National Aeronautics and Space Administration 



INTRODUCTION 



It is anticipated that the growth of aviation in the next decade will
occur primarily in general aviation, thereby placing greater traffic demands on
the uncontrolled airport system. Since Air Traffic Control (ATC) services are 
not normally provided at these airports, automated systems are being evaluated 
as a means of ensuring safe and orderly air traffic flow at high density 
uncontrolled airports. 

The National Aeronautics and Space Administration (NASA), in cooperation
with the Federal Aviation Administration (FAA), has developed an experimental
Automated Pilot Advisory System (APAS) (reference 1) to provide airport and air
traffic advisories at high density uncontrolled airports. The APAS concept is
to utilize low cost automated systems to provide the necessary information for
pilots to more safely plan and execute approach and landing at uncontrolled
high density airports. The system is designed to be a natural extension of the
procedural Visual Flight Rules (VFR) system used at uncontrolled airports and,
as an advisory system, will enhance the "see-and-be-seen" rule.

The current system used at uncontrolled airports is for pilots to
"self-announce" (traffic advisory) over a UNICOM radio channel and request
the active runway (airport advisory) from the Fixed Base Operator (FBO). The
UNICOM radio channel is also used for general information and requests, and
can be shared by several different airports. For example, the UNICOM at
Manassas airport is shared by Manassas, Montgomery County, Warrenton, and
Freeport. The problems with this type of system are (1) not all pilots
self-announce; (2) the active runway information may not be available (the FBO
may be away from the radio performing other jobs, etc.); (3) there may be
radio interference due to multiple transmissions; and (4) a self-announcement
at one airport may be mistaken by pilots at another airport sharing the
frequency for local traffic.

The experimental APAS was designed to be a test instrument with which the
concept could be evaluated and experiments could be performed to determine the
specifications for an operational system. Testing of the experimental system
was initially performed at NASA's Wallops Flight Center (WFC) using NASA test
pilots, but in late May 1980, the APAS was moved to Manassas Municipal Airport,
Manassas, Virginia. This airport was selected because it is a high density
uncontrolled airport with an estimated 200,000 operations per year. From
June 23 to August 16, 1980, the experimental APAS was operated daily between
9 a.m. and 5 p.m. (9 a.m. to 10 p.m. the week of August 11), and an evaluation
of the APAS concept was obtained from pilots who used the system. These
evaluations and the system performance are presented.



APAS DESIGN AND CONFIGURATION 



In order to implement the APAS concept, the APAS was required to have the
following design features:

(1) Low Cost - The system must be affordable to most of the county, municipal, 
or privately-owned airports in the nation. (A cost limit of $50,000 in 
1975 dollars was imposed for the APAS.) 

(2) Airport Advisory System - This system should be capable of: 

(a) Issuing a report at least once every two minutes which would include
an airport identifier, time of day, favored runway, wind speed,
direction and gust, altimeter setting, and ambient and dew point
temperatures. (A sketch of such a message is given following this list.)

(b) Automatically selecting the runway and having self-checking features. 

(c) Manual control over runway selection and sensor fault status via an
operator control panel.

(d) Handling at least five additional sensors. 

(3) Traffic Advisory System - This system should be capable of: 

(a) Issuing a report every 20 seconds to identify the number of aircraft 
on each pattern leg and the position, bearing, and heading of 
non-pattern aircraft. 

(b) Radar surveillance of a non-cooperative aircraft via a skin tracking 
radar. 

(c) Radar coverage to five nautical miles. 

(d) Height detection. 

(e) Reporting at least ten (10) aircraft and tracking at least twenty (20) 
aircraft. 

(4) Interface - The APAS should require only a standard Very High Frequency 
(VHF) radio. 
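
To make requirement (2)(a) concrete, the following sketch shows one way the
listed items could be assembled into a spoken advisory string. It is written
in Python purely as an illustration; the field names, units, and phrasing are
assumptions and do not reproduce the actual APAS software or message wording.

    # Hypothetical sketch of an airport advisory composer (not the APAS code).
    from dataclasses import dataclass

    @dataclass
    class WeatherSample:
        wind_dir_deg: int        # direction wind is from, degrees (assumed units)
        wind_speed_kt: int
        wind_gust_kt: int        # 0 means no reported gust
        altimeter_in_hg: float
        temp_f: int
        dew_point_f: int

    def airport_advisory(airport_id: str, time_hhmm: str,
                         favored_runway: str, wx: WeatherSample) -> str:
        """Build the text of one airport advisory, issued at least every two minutes."""
        gust = f", gusting {wx.wind_gust_kt}" if wx.wind_gust_kt else ""
        return (f"{airport_id} advisory, time {time_hhmm}. "
                f"Favored runway {favored_runway}. "
                f"Wind {wx.wind_dir_deg:03d} degrees at {wx.wind_speed_kt}{gust}. "
                f"Altimeter {wx.altimeter_in_hg:.2f}. "
                f"Temperature {wx.temp_f}, dew point {wx.dew_point_f}.")

    # Example use:
    # print(airport_advisory("Manassas", "1430", "34",
    #                        WeatherSample(330, 8, 0, 29.92, 85, 63)))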






To meet these requirements, the experimental APAS configuration (Figure 1)
used at the Manassas airport included a radar set, mini- and micro-computers,
weather sensors, a VHF transmitter, and an operator control panel.

Ideally, an APAS radar system should have the following features:
solid-state electronics, a Moving Target Indicator (MTI) or Doppler processor
for ground clutter elimination, the capability of detecting a 0.5 m² target at
three nautical miles with 300 meter range resolution, and a cost of $30,000 in
production runs. Studies performed (reference 2) to select the APAS radar
indicated that MTI and Doppler type radars were either cost prohibitive
(>$250,000) or had insufficient range capability (<1.8 nautical miles). From
these studies, it was concluded that the most suitable radar for APAS was the
Marine Pathfinder surveillance radar. This non-coherent radar is solid-state,
except for a magnetron and modulator switch tube, and requires targets to be
detected and tracked in a ground clutter environment. To accomplish this,
the APAS used clutter suppression techniques (narrow beam width antennas,
Sensitivity Time Control (STC) for each antenna, and a clutter screen set to
attenuate Radio Frequency (RF) signals below two degrees elevation) and
software target detection algorithms (clutter mapping, thresholding, and mean
level detection).
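
The report does not describe the detection algorithms in detail. The sketch
below is only a generic illustration of the kind of mean-level thresholding
named above, under the assumption that each azimuth sweep yields an array of
range-bin amplitudes and that a bin is declared a detection when it exceeds
the mean of its neighbors by a chosen factor; the window size and factor are
assumed values, not APAS parameters.

    # Generic mean-level threshold detector (illustrative only; not the APAS algorithm).
    from statistics import mean

    def mean_level_detections(range_bins, window=8, factor=3.0):
        """Return indices of range bins whose amplitude exceeds the local mean
        (computed over neighboring bins, excluding the bin under test) by 'factor'."""
        hits = []
        n = len(range_bins)
        for i, amp in enumerate(range_bins):
            lo, hi = max(0, i - window), min(n, i + window + 1)
            neighbors = range_bins[lo:i] + range_bins[i + 1:hi]
            if neighbors and amp > factor * mean(neighbors):
                hits.append(i)
        return hits

    # Example: a target-like spike at bin 5 in a clutter background.
    # print(mean_level_detections([2, 3, 2, 2, 3, 40, 3, 2, 2, 3]))  # -> [5]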

A single transmit antenna and multiple receive antennas were selected for the
APAS to enable the system to determine whether aircraft were at pattern
altitude, above pattern altitude, or so high that they were of no interest.
For the Manassas configuration, three receive antennas were used and set at 5,
10, and 20 degrees elevation with beam widths of 4, 6, and 13 degrees,
respectively. One antenna was scanned 360 degrees every two seconds, resulting
in a six-second target update rate. (Under certain conditions, the signal
returned from a target was received by two of the three antennas.)
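
As a rough illustration of how returns from the stacked receive fans could be
turned into a height class, the sketch below assumes only that each track
records which of the 5-, 10-, and 20-degree fans detected the target and at
what range. The geometry uses the beam edges quoted above, but the pattern
altitude, the margins, and the "no interest" cut-off are invented thresholds;
this is not the documented APAS logic.

    # Rough sketch: infer a coarse height class from which stacked receive fans
    # detected the target at a known range (assumed logic, not the APAS algorithm).
    import math

    # Manassas configuration: (boresight elevation deg, beam width deg) per receive fan.
    FANS = [(5.0, 4.0), (10.0, 6.0), (20.0, 13.0)]

    def height_class(detections, range_nm, pattern_alt_ft=1000.0):
        """detections: booleans for the 5-, 10-, and 20-degree fans."""
        # The lower edge (deg) of the lowest fan that saw the target bounds the
        # minimum elevation angle, hence the minimum height at this range.
        edges = [el - bw / 2.0 for (el, bw), hit in zip(FANS, detections) if hit]
        if not edges:
            return "no return"
        min_elev_deg = min(edges)
        min_height_ft = range_nm * 6076.0 * math.tan(math.radians(min_elev_deg))
        if min_height_ft > 3.0 * pattern_alt_ft:      # assumed cut-off for "no interest"
            return "too high - no interest"
        if min_height_ft > 1.5 * pattern_alt_ft:      # assumed margin above the pattern
            return "above pattern altitude"
        return "at or near pattern altitude"

    # Example: target seen only by the 10-degree fan at 2 nautical miles.
    # print(height_class([False, True, False], 2.0))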

The mini- and micro-computers were used to provide target detection and
tracking, pattern classification, evaluation of weather sensory data, and 
generation of audio voice messages for transmission to aircraft. The operator 
control panel provided manual control over runway selection and weather 
sensory status. 



SYSTEM PERFORMANCE EVALUATION 



An evaluation of the APAS performance in a high density uncontrolled 
environment was one of the primary objectives of the Manassas testing. The 
purpose of this evaluation was to determine the adequacy of system 
specifications and to ascertain whether any system degradation would occur due 
to high traffic density or other factors. The primary areas of concern were 
system cycle time, target detection, tracking, and message rates. 

The methods used to evaluate APAS performance included a continual
verification of advisory reports and the maintenance of a log of system
anomalies and pertinent data. Additionally, during two 90-minute periods each
day, all traffic advisory reports were recorded and a count was obtained of
those reports verified or unverified by radar or visual spotters. Throughout
the six-week test period, 95 percent of the APAS reports were verified during
the 90-minute counts. The breakdown of the five percent incorrect reports
showed that one percent were losses of track on the final leg, one percent
were late reports on departing aircraft, two percent were false tracks caused
by large earth-moving equipment being used to construct a parallel runway, and
one percent were attributable to various other causes. The occurrences of the
incorrect final and departure reports were aggravated by the earth-moving
equipment and by site location problems unique to the experimental APAS.
These two factors caused a higher-than-normal radar signal to be required for
target detection, thereby decreasing the probability of detection. It should
be noted that the APAS software contains computer code to eliminate problems
produced by roadways, but it could not be utilized because the "roadway" for
the earth-moving equipment was one-half mile wide.

During the test period, the maximum traffic density occurred on Sunday,
July 13, 1980. The total track rate, operation rate, and traffic report
histogram data for this day are presented in Figures 2, 3, and 4, respectively.
(Total track rate is the number of APAS validated tracks per hour; the
operation rate is the sum of take-offs and landings per hour; the traffic
report histogram depicts the number of traffic reports containing "N" number
of aircraft.) These data indicate that the APAS operated for five hours at
an operational rate exceeding 50 operations per hour, with a peak rate of 70
operations during a one-hour period. Additionally, the system reported its
design limit of 10 aircraft on several occasions. System performance
measurements during this period indicated: (1) the two-second system cycle
time was maintained; (2) no degradation occurred in traffic report accuracy
rates (the highest accuracy rate achieved during the six-week test occurred
during the five-hour high density period on July 13); and (3) the time for a
traffic advisory message exceeded the 20-second period several times, but
system software handled this situation by delaying the next advisory by the
time overrun.
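
The overrun handling described in item (3) amounts to a simple scheduling
rule: if a spoken traffic advisory runs past its 20-second slot, the next
advisory is pushed back by the overrun rather than clipped. A minimal Python
sketch of that rule follows; the 20-second constant comes from the text, but
the function structure is an assumption for illustration.

    # Minimal sketch of the advisory scheduling rule described above (assumed structure).
    ADVISORY_PERIOD_S = 20.0   # nominal traffic advisory interval

    def next_advisory_time(scheduled_start: float, actual_duration: float) -> float:
        """Return the start time of the next traffic advisory.

        If the message fit inside its 20-second slot, the next advisory stays
        on the nominal schedule; if it ran over, the next advisory is delayed
        by the overrun so messages never overlap."""
        overrun = max(0.0, actual_duration - ADVISORY_PERIOD_S)
        return scheduled_start + ADVISORY_PERIOD_S + overrun

    # Example: a 26-second message starting at t = 100 s pushes the next
    # advisory from t = 120 s to t = 126 s.
    # print(next_advisory_time(100.0, 26.0))   # -> 126.0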

During this test period, the APAS performance in marginal VFR conditions 
was mixed. On two occasions, during very hazy conditions, the APAS experienced 
no performance degradation; on other occasions, in light to moderate rain, the 
traffic advisory system was turned off because of numerous false target reports. 
The APAS contains computer software which detects the existence of rain and 
attempts to maintain pattern reports while deleting traffic reports outside 
the pattern in the area where the rain occurs. This software was used with 
favorable results on several occasions during isolated thunderstorms. 
Although the computer software in the experimental APAS did not contain the 
proper messages, it appears that the rain detection software could be expanded 
to handle the moderate rain problems. 
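
The rain handling described above can be pictured as a filter applied to the
candidate traffic reports: pattern-leg reports are kept, while non-pattern
reports falling inside the detected rain region are dropped. The sketch below
is a hedged illustration of that filtering step only; the data structures and
the rectangular rain-region representation are assumptions, not the APAS
implementation.

    # Illustrative filter for traffic reports during detected rain (assumed structures).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TrafficReport:
        azimuth_deg: float
        range_nm: float
        pattern_leg: Optional[str]   # e.g. "downwind", or None for non-pattern traffic

    @dataclass
    class RainRegion:
        az_min_deg: float
        az_max_deg: float
        range_min_nm: float
        range_max_nm: float

        def contains(self, rpt: TrafficReport) -> bool:
            return (self.az_min_deg <= rpt.azimuth_deg <= self.az_max_deg and
                    self.range_min_nm <= rpt.range_nm <= self.range_max_nm)

    def filter_for_rain(reports, rain: RainRegion):
        """Keep pattern reports; drop non-pattern reports inside the rain region."""
        return [r for r in reports
                if r.pattern_leg is not None or not rain.contains(r)]

    # Example: a non-pattern return inside the rain cell is suppressed; the
    # downwind aircraft is still reported.
    # kept = filter_for_rain(
    #     [TrafficReport(120, 3.5, None), TrafficReport(200, 1.0, "downwind")],
    #     RainRegion(100, 140, 2.0, 5.0))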

The experimental APAS had a seven- to eighteen-second system delay which
resulted in aircraft completing a pattern leg turn being reported on the
previous pattern leg. This time delay was caused by a combination of the
traffic advisory reporting time, the six-second target update rate, and target
coast mode following a missed detection. Initial users of APAS expressed
concern about the delay, but pilots who continually used the system indicated
that, if they didn't locate the traffic reported in a pattern leg, they would
instinctively look for traffic on the next pattern leg and, therefore, the
delay wasn't a problem.

During the APAS operational period, the Manassas UNICOM voice traffic
was significantly reduced. This condition was illustrated by comparing the
voice traffic that occurred immediately before with that which occurred during
short periods in which APAS messages were terminated to store tracking data.
During these periods pilots reverted to the self-announcement system.
Although measurements were not made to quantify it, the reduction was
significant enough to be obvious to those who monitored the UNICOM
frequency.

The only APAS anomaly occurred in the runway selection algorithm, which
caused a runway change three times over a five-minute period in light and
variable winds. An analysis of the problem indicated that the number of runways
impacted several input parameters in unforeseen ways. An immediate fix was
implemented by changing the value of one input parameter, but this fix
would negate the universality of the algorithm. A solution to this problem
has been proposed but has not been tested.
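
The report does not give the runway selection algorithm or its proposed fix.
The sketch below only illustrates the generic idea that such an algorithm must
embody to avoid the anomaly described above: pick the runway most nearly into
the wind, but hold the current runway when winds are light and variable so the
selection does not flip back and forth. All names and thresholds are
assumptions.

    # Generic wind-based runway selection with hysteresis (illustrative only;
    # this is NOT the APAS algorithm or its proposed fix).
    import math

    def select_runway(runway_headings_deg, wind_dir_deg, wind_speed_kt,
                      current_runway_deg, min_switch_kt=5.0):
        """Pick the runway heading most nearly into the wind, but keep the current
        runway when winds are light (below min_switch_kt) to avoid flip-flopping."""
        if wind_speed_kt < min_switch_kt and current_runway_deg is not None:
            return current_runway_deg
        def headwind(rwy_deg):
            # Headwind component for landing on this runway heading.
            return wind_speed_kt * math.cos(math.radians(wind_dir_deg - rwy_deg))
        return max(runway_headings_deg, key=headwind)

    # Example: with a 3-knot variable wind the current runway (340 deg) is kept;
    # with a 12-knot wind from 180 deg, the 160-deg runway would be favored.
    # print(select_runway([340, 160], 180, 3, current_runway_deg=340))
    # print(select_runway([340, 160], 180, 12, current_runway_deg=340))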



PILOT EVALUATION 



The second objective of the Manassas testing was to obtain pilot
evaluations of the APAS concept in the uncontrolled high density environment.
To accomplish this, the experimental APAS was operated for an eight-hour period
each day for six weeks. An informational package, including a questionnaire,
was distributed to pilots who used the system, and one hundred pilots responded
to the questions (Q). Their responses (R) and the author's comments (C) are
presented:



Q: Date and time of experience?

R: Not applicable.

Q: Pilot Hours?

R: 50 - 5%
100 - 6%
200 - 12%
500 - 18%
1000 - 17%
>1000 - 42%

Q: a. Function?

R: Pilot - 99%
Co-Pilot - 1%

Q: b. Rating?

R: Private - 7%
Commercial - 2%
Instrument - 12%
SEL - 28%
Multiple - 51%



Q: Type of aircraft?

R: SEL - 81% 
MEL - 16% 
Other - 3% 

Q: APAS Voice Quality? 

R: Unusable - 0% 
Confusing - 1% 
Satisfactory - 39% 
Excellent - 53% 
Other - 7% (4% favorable and 3% unfavorable) 

Q: Was the airport advisory two minute rate satisfactory? 

R: Yes - 89% 
No - 11% 

C: Most of the no responses occurred on hazy days when pilots indicated they
needed favored runway information more often. The two-minute rate was
insufficient because these pilots were released from a controlled condition to
VFR and tuned to the APAS broadcast only after they had the airport in sight.
Invariably, some pilots had to fly around the airport for almost two minutes
to learn the favored runway from the next airport advisory.

Q: Was the airport advisory message format acceptable? 

R: Yes - 92% 
No - 8% 

Q: Any improvements in airport advisory? 

R: No improvement - 38%
Repeat runway more often - 12%
Runway change confusing - 10%
Temperature and dew point information not necessary - 6%
Other - 34%






Q: a. Did you experience a change in active runway? 

R: Yes - 18% 
No - 82% 

C: The APAS selects the favored runway by a technique which is a function of the
prevailing winds. When conditions occur which produce a change in the
favored runway, the APAS initiates the change by announcing it on the next
airport advisory message. On each of the next six traffic advisory reports,
which occur between airport advisories, the runway change is announced
following the traffic report. The process is completed on the next airport
advisory, when the favored runway is announced to be the new one.
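
The change-over procedure described in this comment can be summarized as a
small announcement schedule. The sketch below (a hypothetical structure, not
the APAS code or its message wording) generates that sequence: the change is
announced on the next airport advisory, repeated after each of the following
six traffic advisories, and the new runway then simply becomes the favored
runway on the subsequent airport advisory.

    # Hypothetical sketch of the runway-change announcement sequence described above.
    def runway_change_announcements(old_runway: str, new_runway: str):
        """Yield (message_type, text) pairs in the order they would be broadcast."""
        yield ("airport advisory",
               f"Favored runway changing from {old_runway} to {new_runway}.")
        for _ in range(6):   # the six traffic advisories between airport advisories
            yield ("traffic advisory suffix",
                   f"Runway change in progress, new favored runway {new_runway}.")
        yield ("airport advisory", f"Favored runway {new_runway}.")

    # Example:
    # for kind, text in runway_change_announcements("34", "16"):
    #     print(kind, "-", text)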

Q: b. If so, describe your reaction. 

R: Dangerous - 22% 
Confusing - 28% 
Satisfactory - 28% 
Orderly - 22% 

C: Two occurrences contributed negative responses to this question. The first
was the runway change anomaly described in the system performance evaluation,
during which several aircraft were forced to taxi back-and-forth on the taxiway
while the APAS kept changing the favored runway. This occurrence caused
several responses that the runway change method was confusing.

The second occurrence resulted from a breakdown in control over the favored
runway. Since controlling the runway would be part of any APAS evaluation,
an agreement was made with the Manassas airport authorities whereby the
Manassas FBO would direct anyone requesting the favored runway to obtain the
information from the APAS broadcast. On two occasions this procedure failed,
and a favored runway different from the one selected by APAS was announced
on the UNICOM frequency. On both occasions, the result was two
aircraft simultaneously attempting to land on opposite runways.
Announcements were made to divert the aircraft, but several "dangerous"
responses were received from pilots.

Q: Was the traffic advisory rate satisfactory? 

R: Yes - 89% 
No - 11% 

C: A non-limiting method was chosen to announce traffic information for the APAS.
Non-pattern reports were ordered by azimuth so that pilots could differentiate
between potentially conflicting and non-conflicting aircraft. This method would
produce numerous target reports in high traffic densities, so the next several
questions were designed to evaluate the method.
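
A minimal sketch of the ordering step mentioned here: pattern-leg counts are
reported first, then non-pattern aircraft sorted by azimuth so that consecutive
reports describe neighboring sectors. The report fields and wording are
assumptions for illustration, not the actual APAS message format.

    # Illustrative assembly of a traffic advisory with non-pattern reports ordered
    # by azimuth (assumed report format, not the actual APAS message text).
    def traffic_advisory(pattern_counts, non_pattern):
        """pattern_counts: dict leg -> number of aircraft, e.g. {"downwind": 2}.
        non_pattern: list of dicts with 'azimuth_deg', 'range_nm', 'heading_deg'."""
        parts = [f"{n} aircraft on {leg}"
                 for leg, n in pattern_counts.items() if n > 0]
        for t in sorted(non_pattern, key=lambda t: t["azimuth_deg"]):
            parts.append(f"traffic {t['azimuth_deg']:.0f} degrees, "
                         f"{t['range_nm']:.1f} miles, heading {t['heading_deg']:.0f}")
        return "; ".join(parts) if parts else "no reported traffic"

    # Example:
    # print(traffic_advisory({"downwind": 2, "final": 1},
    #                        [{"azimuth_deg": 250, "range_nm": 4.0, "heading_deg": 90},
    #                         {"azimuth_deg": 40, "range_nm": 3.0, "heading_deg": 180}]))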

Q: a. Were you able to identify yourself in the traffic advisory? 

R: Yes - 95% 

No - 5% 




Q: b. How many other aircraft were being reported?

R: 1 - 9%
2 - 13%
3 - 24%
4 - 19%
5 - 19%
6 - 10%
7 - 4%
8 - 1%



Q: Were you able to locate all other traffic in the advisory? 

R: Yes - 46% 

No - 54% 

If no, were you able to locate all traffic presenting a potential conflict? 

Yes - 86% 
No - 14% 

Q: What is your opinion of the traffic advisory? 

R: Disastrous - 3% 
Confusing - 8% 
Satisfactory - 34% 
Wonderful - 30% 
Other - 25% (19% favorable and 6% unfavorable) 

Q: Did you experience any false target reports? 

R: Yes - 14% 
No - 86% 

If yes, was it a problem? 

Yes - 45% 
No - 55% 

Q: Did you sight any traffic that was not reported by the system?

R: Yes - 20% 
No - 80% 



Q: Was the traffic advisory information in a format that you fully understood? 

R: Yes - 95% 

No - 5% 






Q: What is your opinion of the APAS messages vs. self-announcement? 



R: Favored APAS - 87.5% 

Favored self-announcement - 12.5% 

Q: Comments: 

R: Favorable - 86.5% 
Unfavorable - 13.5% 



C: The favorable comments indicated that pilots thought that APAS was a safer
system than the self-announcement procedure. The unfavorable comments
were in two general areas: system delay and lack of knowledge about
pilot intentions.



CONCLUDING REMARKS 



The testing at Manassas was the first attempt to evaluate an APAS in a high
density uncontrolled environment. As a minimum, this test proved that low-cost
automated systems can provide airport and air traffic advisory information at
high density uncontrolled airports, and that a large majority of the users
preferred the APAS over a self-announcement procedure.

The operational performance of the APAS indicated that additional 
investigations should be conducted in the following areas:

Clutter Suppression.- Enhancements in clutter suppression will decrease the
false target report rate and could solve the final and departing aircraft
reporting problem. The enhancements could be made in several ways, such as
increasing the height of the antenna platform and optimizing the transmit
and receive antenna elevation beam widths. It is recognized that an MTI
type of radar would solve the clutter problem, and this type of radar may be
required at some "trouble" airports, but the cost of this solution should be
analyzed against system affordability.

System Delay.- Decreasing the system time delay appears feasible without
significantly increasing system cost by using a dual receiver radar system and
concurrently processing two receive antennas. It is suggested that the lowest
elevation antenna be processed every cycle and the two upper elevation antennas
be alternately processed. This method should result in a three- to seven-second
system delay and have additional benefits, such as increasing the range
of initial target reporting and decreasing the false target report frequency.
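
The suggested dual-receiver scheme can be expressed as a simple scan schedule:
with two receivers and a two-second antenna rotation, the 5-degree fan is
processed on every rotation while the 10- and 20-degree fans alternate on the
second receiver, so the low fan is updated every 2 seconds instead of every 6.
The sketch below is only an assumed representation of that schedule.

    # Sketch of the suggested dual-receiver scan schedule (illustrative assumption).
    def dual_receiver_schedule(n_rotations: int):
        """Return, per two-second rotation, the pair of fans processed concurrently:
        the 5-degree fan every rotation, the 10- and 20-degree fans alternating."""
        upper = ["10 deg", "20 deg"]
        return [("5 deg", upper[i % 2]) for i in range(n_rotations)]

    # With this schedule the 5-degree fan update interval drops from 6 s (one fan
    # per rotation) to 2 s, and each upper fan is still revisited every 4 s.
    # print(dual_receiver_schedule(4))
    # -> [('5 deg', '10 deg'), ('5 deg', '20 deg'), ('5 deg', '10 deg'), ('5 deg', '20 deg')]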






Channel Assignments.- The decrease in UNICOM voice traffic during APAS
operations and the APAS requirement of only a 10- to 20-nautical mile broadcast
coverage area are significant factors in assessing frequency channel
assignments for an operational system. Additional channels for the APAS
broadcast may be obtained by assigning more uncontrolled airports the same
UNICOM frequency.

The initial objectives of the APAS program have been accomplished in that 
concept feasibility has been demonstrated and a system description can be 
defined. It is recommended that a Phase II program be initiated to incorporate 
the results of the Manassas testing into a follow-on system. 






REFERENCES 



1. Parker, L. C., et al.: "An Automated Pilot Advisory System." Paper
presented at the 1978 IEEE Region 3 Conference and Exhibit, Atlanta, GA,
April 10-12, 1978.

2. Haidt, J. G., et al.: "Pilot Advisory System Feasibility Definition
Study." Research Triangle Institute, February 1977.






[Figure 1 graphic not reproduced. Label from the original: "0 - 60° coverage
by multiple fans."]

Figure 1.- Automated pilot advisory system.



[Figure 2 graphic not reproduced. Axes: tracks per hour versus time of day
(hours), 10 through 16.]

Figure 2.- Track rate - July 13, 1980.






[Figure 3 graphic not reproduced. Axes: operations per hour versus time of day
(hours).]

Figure 3.- Operational rate - July 13, 1980.



[Figure 4 graphic not reproduced. Axes: number of traffic reports versus number
of aircraft reported, 1 through 11.]

Figure 4.- Traffic report histogram - July 13, 1980.


